Market Roundup: August 12, 2005

HP and Red Hat Challenge Blade Pricing Conventions
At LinuxWorld in San Francisco this week, HP and Red Hat
announced a new pricing strategy, a blade server bundle, for HP blades running
Red Hat Linux. The companies intend to
sell a per-chassis license for management software and up to eight servers
running Red Hat’s Enterprise Linux, rather than licensing these on a per-server or
per-processor basis. HP and Red Hat have provided no further details at this
point, although HP suggested that pricing could be up to 20% lower than
individual licenses. The two companies position this as a forward-looking
model for delivering flexible combinations of software and hardware.
While this announcement is currently more of a statement of
direction and intention than a rollout of a polished pricing structure, it’s
fairly clear that HP and Red Hat have set themselves on a different road to
licensing heaven. The idea as it stands has both positive and negative aspects.
On the positive side, simplified pricing could ease the burden for staff
responsible for license management. Licensing has generally measured either a
number of users or a number of processors to determine price. In the classic IT
world this was fairly straightforward. In the new world of virtualization,
on/off capacity on demand, and dual-core processors, there is a lot more room for
interpretation. License management is not insignificant for many large firms,
and blades are an area where multiple servers handling many changing workloads
are common. This space could serve as a good Petri dish for server pricing
experiments, with an audience that recognizes the growing problem and will
appreciate proffered solutions. On the other hand, blade chassis are not
designed to handle servers only. The IBM BladeCenter in particular has the
Blade.org initiative designed to help partners like NetApp
or Brocade design a range of products for the BladeCenter chassis. A blade
server bundle would only work on chassis populated exclusively with
servers. Customers will have to work carefully with their
vendors to make sure that they’re getting the right package for the blade
chassis. It is a good first step, but it is not a pricing panacea.
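To see how such bundle pricing might play out, consider a hypothetical comparison in Python. The per-server price and the flat 20% discount below are illustrative assumptions only; neither HP nor Red Hat has published actual figures.

```python
# Hypothetical comparison of per-server vs. per-chassis licensing.
# All prices are illustrative assumptions, not announced figures.

PER_SERVER_LICENSE = 799       # assumed annual subscription per blade
BLADES_PER_CHASSIS = 8         # maximum covered by the bundle
BUNDLE_DISCOUNT = 0.20         # "up to 20% lower," per HP's suggestion

def individual_cost(blades: int) -> float:
    """Cost of licensing each blade separately."""
    return blades * PER_SERVER_LICENSE

def bundle_cost() -> float:
    """Cost of one per-chassis bundle covering up to eight blades."""
    return BLADES_PER_CHASSIS * PER_SERVER_LICENSE * (1 - BUNDLE_DISCOUNT)

full = individual_cost(8)      # 8 * 799 = 6392
bundled = bundle_cost()        # 6392 * 0.8 = 5113.60
print(f"Individual: ${full:.2f}, bundle: ${bundled:.2f}")
```

Note that under these assumptions a partly populated chassis narrows or erases the gap: at four blades, individual licensing would be cheaper than the eight-blade bundle, which underscores the mixed-use caveat.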
At the same time, we roundly applaud any attempt to rationalize software pricing in light of changing technology capabilities. Physical licenses tied to virtual infrastructure are a recipe for trouble. We are certain that the path will not be direct, but we do believe it is important that vendors start to realign their software pricing to hardware realities sooner rather than later.

Vendors such as Computer Associates have already started on different pricing structures. For example, CA now offers BrightStor Managed Capacity Licensing based on terabyte capacity, which eliminates the need to track server- or application-based licenses; users pay for the total amount of storage they will manage. These initiatives may not work for every customer, and most are aimed at high-end customers who find license management a significant aspect of data-center complexity. However, we believe that, as with many IT initiatives, vendors will quickly discover that mid-market customers with smaller volumes can benefit just as much as their larger counterparts, and will tailor the programs to accommodate them as well. We are pleased to see that HP has begun talks with other OS companies about similar arrangements. We strongly urge those companies to work with HP to try new models and help lead the industry forward.
Novell and Open Source: It’s the Model, not the Technology
Novell made a series of announcements at LinuxWorld this week, including several concerning open source development strategies. The company formally announced the openSuSE project, through which developers will gain access to the latest versions of SuSE Linux. Under the program, the community development effort will be frozen at a point in time, with modifications tested and approved, and the release then put out again for further development. Novell also announced migration tools for NetWare and Windows users who want to move file and print functions to Linux. In addition, the company has reached an agreement with MySQL AB under which Novell will deliver enhanced services and support for MySQL and offer subscriptions to the MySQL database directly to its customers. Finally, Novell announced a new go-to-market campaign with open source application partners as part of its Market Start program; these providers will be backed by Novell’s support.
Certainly none of these types of developments is new. IBM
has been at this type of thing for some time, emphasizing partners and Linux as
a means to expand its revenue opportunities. Other vendors are also beginning
to take a fresh look not only at Linux but at the means by which products are
developed and distributed. This trend is gaining momentum and shows no signs of
slowing down.
Our interest lies not just in the technology developments themselves, but in the means by which they are being achieved. A recent article noted that Linux was going through growing pains as customers demanded more features and applications. To our minds, such growing pains are an unqualified indicator that Linux momentum is accelerating. Imagine the opposite: if customers were saying “don’t bother,” one could argue that Linux was losing ground. To meet this demand, companies like Novell, Red Hat, and IBM are realizing that community development in an open source environment not only speeds development but also allows for broader and quirkier innovation, since new ideas are much easier to promulgate in the open than in a closed development shop. Developers with good ideas can turn those ideas into code without hacking their way through corporate design hierarchies and philosophies. With the ongoing development of, and enthusiasm for, LAMP and its related technologies, we see no reason to believe that any sort of momentum loss is in the offing. In fact, the opposite would seem to be the case. As LAMP gains stature and support, look for broader open source development efforts to continue and grow. It just makes good business sense.
Bringing Virtual Standards into Reality
VMware has announced that it is working with AMD, BEA
Systems, BMC Software, Broadcom, Cisco, Computer Associates International,
Dell, Emulex, HP, IBM, Intel, Mellanox, Novell, QLogic, and Red Hat to advance
open virtualization standards. The initiative is focused on accelerating open
standards for virtualization, to which VMware will contribute an existing
framework of interfaces, known as the Virtual Machine Hypervisor Interfaces,
based on its commercially available virtualization products, to facilitate the
development of these standards in an industry-neutral manner. The company
also announced that it would provide its partners access to VMware ESX Server
source code and interfaces under a new program called VMware Community Source
that seeks partners’ input on the future direction of VMware ESX Server. The
Community Source program offers partners access to VMware ESX Server source
code under a royalty-free license and they can contribute shared code or create
binary modules intended to spur and extend interoperable and integrated
virtualization solutions. VMware also stated that it believes customers will
benefit from an expanded ecosystem of virtualization solutions, availability of
open standard virtualization interfaces including hypervisors, and more rapid
availability of new virtualization-aware technologies.
Virtualization is a hot topic. No news here. However, in the
heat of pitched marketing battles over the definition and ultimate control of
the word virtualization, the market has borne witness to a plethora of technobabble and marketing speak as each vendor tries to
parlay the virtualization opportunity to its own advantage. Frankly, in the
midst of this cross fire it is amazing that virtualization has done as well as
it has. We believe that VMware’s initiative is a
significant, if not essential, undertaking to help vendors and
customers maximize the value and opportunity of embracing virtualization
technology. Given the emphasis on consolidation and simplification in data
centers across the spectrum of companies, virtualization plays a key role in
improving the efficiency and ROI of industry-standard servers. With utilization
rates as low as a paltry 5% on many x86 servers, virtualization offers a
considerably positive efficiency enhancement. But for end-user organizations to
ultimately glean the maximum value from virtualization, it must go beyond
simply the server and must operate in a heterogeneous environment.
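The utilization argument can be made concrete with a little arithmetic. The sketch below assumes a hypothetical 60% target utilization for a consolidated host; only the 5% figure comes from the discussion above.

```python
# Back-of-the-envelope consolidation arithmetic. The 5% utilization
# figure is cited in the text; the 60% target is an illustrative
# assumption, not a vendor recommendation.
import math

def consolidation_ratio(current_pct: int, target_pct: int) -> int:
    """How many lightly loaded physical servers could, in principle,
    be stacked onto one virtualized host driven to the target level."""
    return target_pct // current_pct

def hosts_needed(servers: int, current_pct: int, target_pct: int) -> int:
    """Physical hosts required after consolidating the given servers."""
    return math.ceil(servers / consolidation_ratio(current_pct, target_pct))

ratio = consolidation_ratio(5, 60)          # 60 // 5 = 12, i.e. 12:1
print(f"{ratio}:1 consolidation; 100 servers -> "
      f"{hosts_needed(100, 5, 60)} hosts")  # 100 servers -> 9 hosts
```

Even under these crude assumptions, a 12:1 consolidation ratio illustrates why the efficiency case is compelling; real-world ratios depend on workload peaks, headroom policy, and I/O constraints.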
The value of the standards being pursued by VMware, while encouraging and standardizing multiple vendors’ virtualization solutions, should ultimately reassure end users that standards-compliant virtualization solutions will interoperate without inflicting vendor-specific technology lock-in. In addition, we believe such standards will allow other virtualization/hypervisor schemes to get along in the virtualized IT reality we see in the future.

The participation of most of the relevant vendors is encouraging in this regard; however, the absence of Microsoft is notable. While Microsoft plans to make virtualization an important part of the next release of Windows, its lack of participation in this initiative could spell interoperability difficulties if the Redmond giant attempts to drive a competing and incompatible virtualization standard into the marketplace. Given the importance of virtualization to the future of IT and business, we sincerely hope that such a split does not occur. In this day and age, IT efficiency and technological peaceful coexistence are the future, not fractured vendor-specific solutions.
NSF Grant Funds VoIP Wiretap Tool
George Mason University researchers have been awarded a
$307,436 grant to create a tool that would allow law enforcement agencies to tap
VoIP calls. The grant, from the National Science Foundation, is designed to
help create such a tool from research done at the university. The grant comes
in the wake of an FCC ruling that requires some VoIP providers to build their
systems so that wiretaps could be applied if asked for by law enforcement
agencies. Those agencies have been requesting such capabilities since VoIP has
gained currency in the marketplace. The theory behind the proposed technology
is that it would be possible to “watermark” individual packets in order to
identify the speakers in the conversation. According to published reports, the
George Mason technology would not be able to decrypt the packets sent.
While law enforcement agencies look to such technology to
help track down criminals of all stripes, privacy advocates warn of potential
abuses of such tools. Such abuses have occurred in the past and undoubtedly
will do so in the future. On the other hand, numerous criminal convictions have
been obtained through wiretap evidence, and one expects that same to be true in
the future. While VoIP users may not be thrilled with the fact that they too will
be subject to wiretaps, they may as well just get used to the idea becoming
reality.
Anything short of that would reverse a long-standing trend that is not going to be discarded anytime soon. Soon after landline telephones became a household staple, federal and local law enforcement got permission (in most cases) to attach a wire to any particular phone line and listen in. With the advent of the cell phone, simple scanners were all that was required to listen in on analog calls. Digital cell phones resolved that problem, but they too are now tracked and monitored on a number of levels. The British government went so far as to request that British mobile phone companies surreptitiously download software to suspects’ mobile phones that turns the microphone on without the user’s knowledge, turning the phone into a mobile listening device. Concerns over the ability to tap VoIP calls parallel the concerns raised by these earlier technologies, and from our point of view they should be seen in a similar light. If requests for wiretaps are abused, then a problem exists regardless of the technology in question. VoIP is just another way to deliver phone service; it needs to meet the minimum standards for law enforcement, just like any other phone service.
EU’s RoHS Directive Starts Showing Itself
Dutch customs agents this week stopped a potentially dangerous shipment from
coming into their country: 1.3 million Sony PlayStations. Sony had to recall
the shipment, repackage all the PlayStations, and fix the problem before
reshipping. The problem? Too much cadmium used in the building of the
PlayStations. The EU is cracking down on toxic tech, and companies have until
July 1, 2006 to comply with the environmentally friendly RoHS directive. Among
the soon-to-be-banned substances are lead, mercury, cadmium, hexavalent
chromium, PBB, and PBDE. There is an exemption clause for cases in which no
substitutes will do.
Toxic technology is not a new idea. In 1982, the Silicon Valley Toxics
Coalition formed and began spearheading efforts to recycle computers, raise
awareness, and develop safer, more earth-friendly technologies. In the U.S.,
however, these chemicals have not been banned as they have under the RoHS
directive. Even though Sweden has had a computer take-back system since 1999,
and the RoHS directive entered into force on February 13, 2003, it is still
argued that some companies will not be able to comply by the July 1, 2006
deadline. The PlayStation debacle is a good example of companies’ lack of
preparation.
As recently as thirty years ago, the air in major North American cities was unfit to breathe due to automobile exhaust. It took legislation and technology changes on the part of automakers to stop cars and trucks from polluting our environment, but the difference is dramatic. The EU is taking a step in the right direction by legislating the technology industry, but it is up to the rest of the world to follow suit. Currently, the major hardware producers (HP, IBM, Lenovo) have computer reclamation programs in place, but the word needs to get out so the public takes advantage of them. Case in point: school districts, when they are lucky enough to buy or receive new computers, often put the old ones up for auction; the age of these post-educational devices dictates that most ultimately end up in landfills, given the largely quiet availability of reclamation programs.

The IT industry needs to go further than low-key reclamation programs, though, and develop its own version of the catalytic converter in the form of non-toxic technology. It can be done, and the EU is forcing the issue. Companies will comply or face the consequences, but they can make a virtue out of necessity. Rather than simply exporting toxic waste to underdeveloped countries that will take the trash in order to earn money, companies could actively promote reclamation, eliminating the need for regulatory intervention. Given the market response to green product initiatives, there should be a ready market in the U.S. and other industrialized countries for earth-friendly technology. Ultimately, industry compliance with EU directives such as this will be to everyone’s benefit, especially those whose backyards are growing mounds of toxic waste exported from the developed world.